Creators/Authors contains: "Kubota, Alyssa"

  1. Many robot-delivered health interventions aim to support people longitudinally at home to complement or replace in-clinic treatments. However, there is little guidance on how robots can support collaborative goal setting (CGS). CGS is the process in which a person works with a clinician to set and modify their goals for care; it can improve treatment adherence and efficacy. However, for home-deployed robots, clinicians will have limited availability to help set and modify goals over time, which necessitates that robots support CGS on their own. In this work, we explore how robots can facilitate CGS in the context of our robot CARMEN (Cognitively Assistive Robot for Motivation and Neurorehabilitation), which delivers neurorehabilitation to people with mild cognitive impairment (PwMCI). We co-designed robot behaviors for supporting CGS with clinical neuropsychologists and PwMCI, and prototyped them on CARMEN. We present feedback on how PwMCI envision these behaviors supporting goal progress and motivation during an intervention. We report insights on how to support this process with home-deployed robots and propose a framework to support HRI researchers interested in exploring this both in the context of cognitively assistive robots and beyond. This work supports designing and implementing CGS on robots, which will ultimately extend the efficacy of robot-delivered health interventions.
  2. 11% of adults report experiencing cognitive decline, which can impact memory, behavior, and physical abilities. Robots have great potential to support people with cognitive impairments, their caregivers, and clinicians by facilitating treatments such as cognitive neurorehabilitation. Personalizing these treatments to individual preferences and goals is critical to improving engagement and adherence, which helps improve treatment efficacy. In our work, we explore the efficacy of robot-assisted neurorehabilitation and aim to enable robots to adapt their behavior to people with cognitive impairments, a unique population whose preferences and abilities may change dramatically during treatment. Our work aims to enable more engaging and personalized interactions between people and robots, which can profoundly impact robot-assisted treatment, how people receive care, and improve their everyday lives.
  3. An estimated 11% of adults report experiencing some form of cognitive decline, which may be associated with conditions such as stroke or dementia and can impact their memory, cognition, behavior, and physical abilities. While there are no known pharmacological treatments for many of these conditions, behavioral treatments such as cognitive training can prolong the independence of people with cognitive impairments. These treatments teach metacognitive strategies to compensate for memory difficulties in their everyday lives. Personalizing these treatments to suit the preferences and goals of an individual is critical to improving their engagement and sustainment, as well as maximizing the treatment's effectiveness. Robots have great potential to facilitate these training regimens and support people with cognitive impairments, their caregivers, and clinicians. This article examines how robots can adapt their behavior to be personalized to an individual in the context of cognitive neurorehabilitation. We provide an overview of existing robots being used to support neurorehabilitation, and identify key principles for working in this space. We then examine state-of-the-art technical approaches to enabling longitudinal behavioral adaptation. To conclude, we discuss our recent work on enabling social robots to automatically adapt their behavior and explore open challenges for longitudinal behavior adaptation. This work will help guide the robotics community as they continue to provide more engaging, effective, and personalized interactions between people and robots. (An illustrative sketch of one simple longitudinal adaptation scheme follows this list.)
  4. JESSIE is a robotic system that enables novice programmers to program social robots by expressing high-level specifications. We employ control synthesis with a tangible front-end to allow users to define complex behavior for which we automatically generate control code. We demonstrate JESSIE in the context of enabling clinicians to create personalized treatments for people with mild cognitive impairment (MCI) on a Kuri robot, in little time and without error. We evaluated JESSIE with neuropsychologists, who reported high usability and learnability. They gave suggestions for improvement, including increased support for personalization, multi-party programming, collaborative goal setting, and re-tasking the robot's role post-deployment, each of which raises technical and sociotechnical issues in HRI. We exhibit JESSIE's reproducibility by replicating a clinician-created program on a TurtleBot 2. As an open-source means of accessing control synthesis, JESSIE supports reproducibility, scalability, and accessibility of personalized robots for HRI. (A toy illustration of generating a controller from a high-level specification follows this list.)
  5. In this work, we present a novel non-visual human activity recognition (HAR) system that achieves state-of-the-art performance on realistic safety-critical environment (SCE) tasks via a single wearable sensor. We leverage surface electromyography and inertial data from a low-profile wearable sensor to attain performant robot perception while remaining unobtrusive and user-friendly. By capturing both convolutional and temporal features with a hybrid CNN-LSTM classifier, our system is able to robustly and effectively classify complex, full-body human activities with only this single sensor. We perform a rigorous analysis of our method on two datasets representative of SCE tasks, and compare performance with several prominent HAR algorithms. Results show our system substantially outperforms rival algorithms in identifying complex human tasks from minimal sensing hardware, achieving F1-scores up to 84% over 31 strenuous activity classes. To our knowledge, we are the first to robustly identify complex full-body tasks using a single, unobtrusive sensor feasible for real-world use in SCEs. Using our approach, robots will be able to more reliably understand human activity, enabling them to safely navigate sensitive, crowded spaces. (An illustrative CNN-LSTM sketch follows this list.)
  6. In safety-critical environments, robots need to reliably recognize human activity to be effective and trustworthy partners. Since most human activity recognition (HAR) approaches rely on unimodal sensor data (e.g., motion capture or wearable sensors), it is unclear how the relationship between the sensor modality and the motion granularity (e.g., gross or fine) of the activities impacts classification accuracy. To our knowledge, we are the first to investigate the efficacy of using motion capture as compared to wearable sensor data for recognizing human motion in manufacturing settings. We introduce the UCSD-MIT Human Motion dataset, composed of two assembly tasks that entail either gross or fine-grained motion. For both tasks, we compared the accuracy of a Vicon motion capture system to a Myo armband using three widely used HAR algorithms. We found that motion capture yielded higher accuracy than the wearable sensor for gross motion recognition (up to 36.95%), while the wearable sensor yielded higher accuracy for fine-grained motion (up to 28.06%). These results suggest that these sensor modalities are complementary, and that robots may benefit from systems that utilize multiple modalities to simultaneously, but independently, detect gross and fine-grained motion. Our findings will help guide researchers in numerous fields of robotics, including learning from demonstration and grasping, to effectively choose the sensor modalities most suitable for their applications. (An illustrative per-modality comparison sketch follows this list.)
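The article in item 3 surveys technical approaches to longitudinal behavior adaptation without prescribing a single algorithm. As a minimal illustration only, and not the article's method, the sketch below personalizes a robot's choice among a few candidate behaviors with an epsilon-greedy bandit that updates from a per-session engagement score; the behavior names, the 0-1 engagement signal, and the session count are hypothetical placeholders.

```python
import random

# Hypothetical candidate robot behaviors to personalize across sessions.
BEHAVIORS = ["gentle_reminder", "humorous_prompt", "step_by_step_guidance"]

class EpsilonGreedyPersonalizer:
    """Minimal epsilon-greedy bandit: occasionally explore a random behavior,
    otherwise pick the one with the highest running-average engagement."""

    def __init__(self, behaviors, epsilon=0.1):
        self.epsilon = epsilon
        self.counts = {b: 0 for b in behaviors}
        self.values = {b: 0.0 for b in behaviors}

    def choose(self):
        if random.random() < self.epsilon:
            return random.choice(list(self.counts))      # explore
        return max(self.values, key=self.values.get)     # exploit

    def update(self, behavior, engagement):
        # Incremental running average of the observed engagement (0..1).
        self.counts[behavior] += 1
        n = self.counts[behavior]
        self.values[behavior] += (engagement - self.values[behavior]) / n

# One simulated longitudinal deployment: choose, observe, and learn each session.
personalizer = EpsilonGreedyPersonalizer(BEHAVIORS)
for session in range(30):
    behavior = personalizer.choose()
    engagement = random.random()      # stand-in for a measured engagement score
    personalizer.update(behavior, engagement)
    print(session, behavior, round(engagement, 2))
```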
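JESSIE (item 4) automatically generates control code from clinicians' high-level specifications via control synthesis; its actual synthesis pipeline is not reproduced here. Purely as a toy illustration of the general idea, and assuming a made-up specification format (an ordered list of treatment activities), the sketch below chains the activities into a small finite-state controller and steps through it.

```python
from dataclasses import dataclass
from typing import Optional

# Made-up high-level specification: the ordered treatment activities a
# clinician wants the robot to walk a user through.
SPEC = ["greet", "memory_exercise", "goal_review", "wrap_up"]

@dataclass
class State:
    name: str
    on_done: Optional[str] = None   # next state once the activity completes

def synthesize_controller(spec):
    """Chain the specified activities into a sequential finite-state controller.
    (A real synthesis tool handles far richer temporal and reactive
    specifications; this sketch only captures ordering.)"""
    states = {}
    for i, name in enumerate(spec):
        nxt = spec[i + 1] if i + 1 < len(spec) else None
        states[name] = State(name=name, on_done=nxt)
    return states

def run(states, start):
    current = start
    while current is not None:
        print(f"Robot performs activity: {current}")
        current = states[current].on_done   # advance when the activity finishes

controller = synthesize_controller(SPEC)
run(controller, SPEC[0])
```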
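Item 5 describes a hybrid CNN-LSTM classifier over surface EMG and inertial data from a single wearable sensor. The sketch below shows the general shape of such a model in PyTorch; the channel count (8 EMG + 6 inertial), window length (100 samples), hidden size, and 31-class output are illustrative assumptions, not the paper's exact architecture or configuration.

```python
import torch
import torch.nn as nn

class CNNLSTMClassifier(nn.Module):
    """Convolutions over the raw window extract local patterns per time step;
    an LSTM then models how those patterns evolve across the window."""

    def __init__(self, in_channels=14, num_classes=31, hidden=128):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv1d(in_channels, 64, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.Conv1d(64, 64, kernel_size=5, padding=2),
            nn.ReLU(),
        )
        self.lstm = nn.LSTM(input_size=64, hidden_size=hidden, batch_first=True)
        self.head = nn.Linear(hidden, num_classes)

    def forward(self, x):                  # x: (batch, channels, time)
        feats = self.conv(x)               # (batch, 64, time)
        feats = feats.permute(0, 2, 1)     # (batch, time, 64) for the LSTM
        _, (h_n, _) = self.lstm(feats)
        return self.head(h_n[-1])          # class logits

# Example: a batch of 2-second windows sampled at 50 Hz (100 time steps).
model = CNNLSTMClassifier()
windows = torch.randn(8, 14, 100)          # (batch, sensor channels, time)
logits = model(windows)                    # (8, 31)
```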
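Item 6 compares motion-capture and wearable-sensor data using three widely used HAR algorithms. The sketch below illustrates only the overall evaluation pattern: window each modality's stream, extract simple per-channel statistics, and cross-validate a classifier per modality. The random data, feature set, channel counts, and random-forest choice are assumptions for illustration and do not reproduce the paper's algorithms or dataset.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

def window_features(stream, labels, win=100, step=50):
    """Slice a (time, channels) stream into windows, compute simple per-channel
    statistics (mean, std) as features, and label each window by its majority
    activity label."""
    X, y = [], []
    for start in range(0, len(stream) - win, step):
        seg = stream[start:start + win]
        X.append(np.concatenate([seg.mean(axis=0), seg.std(axis=0)]))
        y.append(np.bincount(labels[start:start + win]).argmax())
    return np.array(X), np.array(y)

# Placeholder streams standing in for synchronized recordings:
# e.g., mocap joint positions (many channels) vs. a wearable's EMG+IMU channels.
rng = np.random.default_rng(0)
t = 5000
labels = rng.integers(0, 4, size=t)               # 4 toy activity classes
modalities = {
    "motion_capture": rng.normal(size=(t, 39)),   # e.g., 13 markers x 3 axes
    "wearable":       rng.normal(size=(t, 14)),   # e.g., 8 EMG + 6 inertial
}

for name, stream in modalities.items():
    X, y = window_features(stream, labels)
    clf = RandomForestClassifier(n_estimators=100, random_state=0)
    scores = cross_val_score(clf, X, y, cv=5)
    print(f"{name}: mean accuracy {scores.mean():.2f}")
```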